Compressed Spectral Regression for Efficient Nonlinear Dimensionality Reduction

Author

  • Deng Cai
Abstract

Spectral dimensionality reduction methods have recently emerged as powerful tools for various applications in pattern recognition, data mining and computer vision. These methods use information contained in the eigenvectors of a data affinity (i.e., item-item similarity) matrix to reveal the low-dimensional structure of high-dimensional data. One limitation of various spectral dimensionality reduction methods is their high computational complexity: they all need to construct a data affinity matrix and compute its top eigenvectors, which leads to O(n²) computational complexity, where n is the number of samples. Moreover, when the data are highly nonlinearly distributed, some linear methods have to be performed in a reproducing kernel Hilbert space (leading to the corresponding kernel methods) to learn an effective nonlinear mapping. The computational complexity of these kernel methods is O(n³). In this paper, we propose a novel nonlinear dimensionality reduction algorithm, called Compressed Spectral Regression, with O(n) computational complexity. Extensive experiments on data clustering demonstrate the effectiveness and efficiency of the proposed approach.
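The eigenvector step the abstract describes can be illustrated with a minimal Laplacian Eigenmap-style sketch: build a Gaussian affinity matrix, normalize it, and take the top eigenvectors as the embedding. This is an illustrative sketch of the general spectral approach (and of why the O(n²) affinity construction dominates), not the paper's Compressed Spectral Regression algorithm; all function names and the `sigma` bandwidth are assumptions.

```python
# Sketch of spectral embedding via top eigenvectors of a normalized
# affinity matrix. Illustrative only; NOT the paper's CSR algorithm.
import numpy as np

def spectral_embedding(X, n_components=2, sigma=1.0):
    # Pairwise squared distances -> Gaussian affinity matrix W.
    # This dense construction is the O(n^2) step the abstract mentions.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    W = np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma**2))
    # Symmetric normalization: S = D^{-1/2} W D^{-1/2}
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1))
    S = d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    # Eigenvectors for the largest eigenvalues reveal the
    # low-dimensional structure of the data.
    vals, vecs = np.linalg.eigh(S)
    return vecs[:, -n_components:]

# Usage: embed 100 random 10-D points into 2-D.
Y = spectral_embedding(np.random.default_rng(0).normal(size=(100, 10)))
print(Y.shape)  # (100, 2)
```

The full eigendecomposition here costs O(n³); methods such as Spectral Regression and the compressed variant proposed in this paper are designed precisely to avoid paying that cost on large n.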


Similar references

Spectral Regression Discriminant Analysis for Hyperspectral Image Classification

Dimensionality reduction algorithms, which aim to select a small set of efficient and discriminant features, have attracted great attention for Hyperspectral Image Classification. The manifold learning methods are popular for dimensionality reduction, such as Locally Linear Embedding, Isomap, and Laplacian Eigenmap. However, a disadvantage of many manifold learning methods is that their computa...

Full text

Spectral Regression dimension reduction for multiple features facial image retrieval

Face retrieval has received much attention in recent years. This paper comparatively studied five feature description methods for face representation, including Local Binary Pattern (LBP), Gabor feature, Gray Level Co-occurrence Matrices (GLCM), Pyramid Histogram of Oriented Gradient (PHOG) and Curvelet Transform (CT). The problem of large dimensionalities of the extracted features was addresse...

Full text

Spectral Regression: A Regression Framework for Efficient Regularized Subspace Learning

Spectral methods have recently emerged as a powerful tool for dimensionality reduction and manifold learning. These methods use information contained in the eigenvectors of a data affinity (i.e., item-item similarity) matrix to reveal the low dimensional structure in the high dimensional data. The most popular manifold learning algorithms include Locally Linear Embedding, ISOMAP, and Laplacian ...

Full text

Spectral Regression for Dimensionality Reduction, by Deng Cai, Xiaofei He, and Jiawei Han, May 2007

Spectral methods have recently emerged as a powerful tool for dimensionality reduction and manifold learning. These methods use information contained in the eigenvectors of a data affinity (i.e., item-item similarity) matrix to reveal low dimensional structure in high dimensional data. The most popular manifold learning algorithms include Locally Linear Embedding, Isomap, and Laplacian Eigenmap...

Full text

Spectral Regression for Dimensionality Reduction∗

Spectral methods have recently emerged as a powerful tool for dimensionality reduction and manifold learning. These methods use information contained in the eigenvectors of a data affinity (i.e., item-item similarity) matrix to reveal low dimensional structure in high dimensional data. The most popular manifold learning algorithms include Locally Linear Embedding, Isomap, and Laplacian Eigenmap...

Full text


Journal:

Volume   Issue

Pages  -

Publication date: 2015